---
title: Calibration Procedure
keywords: fastai
sidebar: home_sidebar
summary: "Every OpenHSI camera is unique and requires calibration before use. This module provides the abstractions to create the calibration data which are then used in operation."
description: "Every OpenHSI camera is unique and requires calibration before use. This module provides the abstractions to create the calibration data which are then used in operation."
nb_path: "nbs/05_calibrate.ipynb"
---
{% raw %}
{% endraw %}

{% include tip.html content='This module can be imported using `from openhsi.calibrate import *`' %}

{% raw %}
{% endraw %} {% raw %}
{% endraw %} {% raw %}

sum_gaussians[source]

sum_gaussians(x:array, *args)

`*args` holds a flattened (amplitude, peak position, peak width) triple for each Gaussian, followed by a single constant offset.

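As an illustration, a function of this form might be implemented as below. This is a hedged sketch based on the signature above, assuming `*args` is flattened triples of (amplitude, peak position, peak width) followed by a constant; the actual implementation may differ.

```python
import numpy as np

# Hypothetical sketch: *args = (amp1, mu1, sigma1, amp2, mu2, sigma2, ..., constant)
def sum_gaussians(x: np.ndarray, *args) -> np.ndarray:
    y = np.full_like(x, args[-1], dtype=float)  # start from the constant offset
    for amp, mu, sigma in zip(args[0:-1:3], args[1:-1:3], args[2:-1:3]):
        y += amp * np.exp(-((x - mu) ** 2) / (2 * sigma**2))
    return y
```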
{% endraw %} {% raw %}
{% endraw %} {% raw %}

class SettingsBuilderMixin[source]

SettingsBuilderMixin()

{% endraw %} {% raw %}
{% endraw %} {% raw %}

class SettingsBuilderMetaclass[source]

SettingsBuilderMetaclass(clsname:str, cam_class, attrs) :: type

type(object_or_name, bases, dict)
type(object) -> the object's type
type(name, bases, dict) -> a new type
{% endraw %} {% raw %}

create_settings_builder[source]

create_settings_builder(clsname:str, cam_class:Camera Class)

Create a `SettingsBuilder` class called `clsname` based on your chosen `cam_class`.
{% endraw %} {% raw %}
{% endraw %}

Using the SettingsBuilderMixin

There are a few ways to create a SettingsBuilder class that works for your custom camera; they involve Python metaclasses and mixins.

For example, you can create one by doing either of the following. {% include note.html content='Below we use the `SimulatedCamera` class. When calibrating a real camera, replace `SimulatedCamera` with your camera class.' %}

{% raw %}
SettingsBuilder = create_settings_builder("SettingsBuilder",SimulatedCamera)

# using Metaclasses
SettingsBuilder = SettingsBuilderMetaclass("SettingsBuilder",SimulatedCamera,{})

# initialising
sb = SettingsBuilder(json_path="assets/cam_settings.json", 
                     pkl_path="assets/cam_calibration.pkl")
Allocated 69.93 MB of RAM.
{% endraw %} {% raw %}
class CalibrateOpenHSI(SettingsBuilderMixin, SimulatedCamera):
    pass

sb = CalibrateOpenHSI(mode="flat",json_path="assets/cam_settings.json", pkl_path="assets/cam_calibration.pkl")
Allocated 69.93 MB of RAM.
{% endraw %}

Find illuminated sensor area

We assume the x axis (or detector columns) is used for the spectral channels; the rows correspond to the cross-track dimension and are limited by the optics (slit). The usable area is cropped out (windowing can also be used to reduce data intake).
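As an illustration of the idea (not the library's actual algorithm), the illuminated row band can be estimated by thresholding the mean row intensity of a flat-field frame at half its maximum, then padding by an edge buffer, much like the `edgezone` parameter below:

```python
import numpy as np

# Hypothetical sketch: threshold the row-mean intensity at half-maximum
# to locate the slit-illuminated band, then pad by an edge buffer.
def find_illuminated_rows(flat_field: np.ndarray, edgezone: int = 4):
    profile = flat_field.mean(axis=1)  # average over spectral columns
    rows = np.flatnonzero(profile > 0.5 * profile.max())
    row_min = max(rows[0] - edgezone, 0)
    row_max = min(rows[-1] + edgezone, flat_field.shape[0] - 1)
    return row_min, row_max
```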

{% raw %}

SettingsBuilderMixin.retake_flat_field[source]

SettingsBuilderMixin.retake_flat_field(show:bool=True)

Take and store an image with the OpenHSI slit illuminated by a uniform light source.

Keyword arguments:

    show -- flag to show holoviews plot of image.

SettingsBuilderMixin.update_row_minmax[source]

SettingsBuilderMixin.update_row_minmax(edgezone:int=4, show=True)

Find the edges of the slit in flat-field images and determine the region to crop.

Keyword arguments:
    edgezone -- number of pixel buffer to add to crop region (default 4).
    show -- flag to show holoviews plot of slice and edges identified.
{% endraw %} {% raw %}
hvimg=sb.retake_flat_field(show=True)
hvimg.opts(width=400,height=400)
print(sb.calibration["flat_field_pic"].max())
hvimg
255
{% endraw %} {% raw %}
sb.update_row_minmax()
Locs row_min: 7 and row_max: 912
{% endraw %} {% raw %}
sb.update_resolution()
{% endraw %}

Smile Correction

The emission lines, which should be straight vertical lines, appear slightly curved. This is smile error (error in the spectral dimension).
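Conceptually, correcting smile amounts to shifting each row by its own offset so the lines straighten. A simplified integer-shift sketch follows; the actual `update_smile_shifts` procedure may use subpixel methods.

```python
import numpy as np

# Hypothetical sketch: straighten a frame given one integer shift per row.
def apply_smile_shifts(frame: np.ndarray, shifts: np.ndarray) -> np.ndarray:
    out = np.empty_like(frame)
    for i, s in enumerate(shifts):
        out[i] = np.roll(frame[i], -int(s))  # shift this row left by s columns
    return out
```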

{% raw %}
sb.mode_change("HgAr")
{% endraw %} {% raw %}

SettingsBuilderMixin.retake_HgAr[source]

SettingsBuilderMixin.retake_HgAr(show:bool=True, nframes:int=10)

Take and store an image with the OpenHSI slit illuminated by a HgAr calibration source.

Keyword arguments:

    show -- flag to show holoviews plot of image.
    nframes -- number of frames to average for image (default 10).

SettingsBuilderMixin.update_smile_shifts[source]

SettingsBuilderMixin.update_smile_shifts(show=True)

Determine the smile shifts to correct from the HgAr image.

Keyword arguments:
    show -- flag to show holoviews plot of slice and edges identified.
{% endraw %} {% raw %}
hvimg=sb.retake_HgAr(show=True, nframes=1)
hvimg.opts(width=400,height=400)
print(sb.calibration["HgAr_pic"].max())
hvimg
255.0
{% endraw %} {% raw %}
sb.update_smile_shifts()
{% endraw %}

Map the spectral axis to wavelengths

To do this, peaks in the HgAr spectrum are found and refined by curve-fitting with Gaussians. The locations of the peaks then allow interpolation to map array (column) index to wavelength (nm).
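The final interpolation step can be sketched as below. The wavelengths are real Hg/Ar emission lines, but the fitted column positions here are made up for illustration; in practice they come from the Gaussian fits.

```python
import numpy as np
from scipy.interpolate import interp1d

# Hypothetical fitted peak centres (detector column) paired with known Hg/Ar lines (nm).
peak_cols = np.array([20.0, 82.4, 302.8, 737.7])
peak_nm   = np.array([404.656, 435.833, 546.074, 763.511])

# Cubic map from column index to wavelength; it passes exactly through the fitted peaks.
col2nm = interp1d(peak_cols, peak_nm, kind="cubic", fill_value="extrapolate")
wavelengths = col2nm(np.arange(800))
```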

{% raw %}

SettingsBuilderMixin.fit_HgAr_lines[source]

SettingsBuilderMixin.fit_HgAr_lines(top_k:int=10, brightest_peaks:list=[435.833, 546.074, 763.511], filter_window:int=1, interactive_peak_id:bool=False, find_peaks_height:int=10, prominence:float=0.2, width:float=1.5, distance:int=10, max_match_error:float=2.0, verbose:bool=False)

Finds the index-to-wavelength map given a spectrum and a list of emission lines.
To filter the spectra, set `filter_window` to an odd number > 1.

Keyword arguments:
brightest_peaks -- list of wavelengths for the brightest peaks in the HgAr image.
filter_window -- filter window for scipy.signal.savgol_filter.
interactive_peak_id -- flag to interactively confirm the wavelength of peaks.
find_peaks_height, prominence, width, distance -- inputs for scipy.signal.find_peaks.
max_match_error -- maximum difference between the estimated peak wavelength and the wavelength in the HgAr line list.
verbose -- more detailed diagnostic printing.
{% endraw %} {% raw %}
sb.fit_HgAr_lines(top_k=10)
{% endraw %}

Each column in our camera frame (after smile correction) corresponds to a particular wavelength. The interpolation between column index and wavelength is slightly nonlinear, which is to be expected from the diffraction grating; however, it is linear to a good approximation. Applying a linear interpolation gives an absolute error of $\pm$3 nm, whereas the cubic interpolation used here gives an absolute error of $\pm$0.3 nm (approximately the spacing between each column). Using higher-order polynomials doesn't improve the error due to overfitting.

For fast real-time processing, the fast binning procedure assumes a linear interpolation because the binning algorithm then consists of a single broadcasted summation with no additional memory-allocation overhead. A slower, more accurate spectral binning procedure is also provided; it uses the cubic interpolation described here and requires hundreds of temporary arrays to be allocated each time. Binning can also be done in post-processing after collecting the raw data.
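The fast-binning idea can be sketched as a bin index precomputed once per column (valid under a linear wavelength map) plus a single unbuffered summation. This is an illustration of the technique, not the library's exact code.

```python
import numpy as np

ncols, nbins = 1024, 128
# Under a linear wavelength map each column's output bin is fixed,
# so it can be precomputed once up front.
bin_index = np.arange(ncols) * nbins // ncols

def fast_bin(frame: np.ndarray) -> np.ndarray:
    binned = np.zeros((frame.shape[0], nbins), dtype=frame.dtype)
    # One unbuffered summation; no per-frame temporary arrays.
    np.add.at(binned, (slice(None), bin_index), frame)
    return binned
```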

{% raw %}

SettingsBuilderMixin.update_intsphere_fit[source]

SettingsBuilderMixin.update_intsphere_fit(spec_rad_ref_data='assets/112704-1-1_1nm_data.csv', spec_rad_ref_luminance:int=52020, showplot=True)

{% endraw %} {% raw %}
fig = sb.update_intsphere_fit()
#sb.dump() # resave the settings and calibration files
{% endraw %}

Integrating Sphere data

4D datacube with coordinates of cross-track, wavelength, exposure, and luminance. {% include warning.html content='Needs testing!' %}
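For orientation, the reference cube and the saturation mask used below can be sketched in plain NumPy as follows; the axis sizes are hypothetical, and the real cube is built by `update_intsphere_cube` and stored in `calibration["rad_ref"]`.

```python
import numpy as np

# Hypothetical layout: (cross_track, wavelength, exposure, luminance).
exposures  = [0, 5, 8, 10, 15, 20]
luminances = [0, 1_000, 5_000, 10_000, 20_000, 40_000]
rad_ref = np.zeros((16, 448, len(exposures), len(luminances)), dtype=np.float32)

# A capture is considered saturated when many pixels hit the 8-bit ceiling;
# this yields one flag per (exposure, luminance) combination.
saturated = (rad_ref == 255).sum(axis=(0, 1)) > 1000
```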

{% raw %}

SettingsBuilderMixin.update_intsphere_cube[source]

SettingsBuilderMixin.update_intsphere_cube(exposures:List[T], luminances:List[T], nframes:int=10, lum_chg_func:Callable=print, interactive:bool=False)

{% endraw %} {% raw %}
luminances = [0, 1_000, 5_000, 10_000, 20_000, 40_000]
exposures = [0, 5, 8, 10, 15, 20]
sb.calibration["rad_ref"] = sb.update_intsphere_cube(exposures, luminances, nframes=50, lum_chg_func=spt.selectPreset)

# remove saturated images
sb.calibration["rad_ref"] = sb.calibration["rad_ref"].where(
    ~(np.sum((sb.calibration["rad_ref"][:, :, :, :, :] == 255), axis=(1, 2)) > 1000)
)
{% endraw %}

When you are happy with the calibration, dump the updates.

{% raw %}
#sb.dump(json_path=json_path_target, pkl_path=pkl_path_target)
{% endraw %}

SpectraPT TCP Client

Class to interact with the Spectra PT integrating sphere.
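A minimal sketch of what the `client` method's underlying exchange might look like, assuming a simple line-based text protocol over TCP; the sphere's actual protocol is not documented here, so treat the message framing as an assumption.

```python
import socket

# Hypothetical line-based TCP exchange; the real Spectra PT protocol may differ.
def send_command(msg: str, host: str = "localhost", port: int = 3434) -> str:
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(msg.encode() + b"\n")   # send one command line
        return sock.recv(1024).decode().strip()  # read the reply
```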

{% raw %}

class SpectraPTController[source]

SpectraPTController(lum_preset_dict:Dict[int, int]={0: 1, 1000: 2, 2000: 3, 3000: 4, 4000: 5, 5000: 6, 6000: 7, 7000: 8, 8000: 9, 9000: 10, 10000: 11, 20000: 12, 25000: 13, 30000: 14, 35000: 15, 40000: 16}, host:str='localhost', port:int=3434)

{% endraw %} {% raw %}
{% endraw %} {% raw %}

SpectraPTController.client[source]

SpectraPTController.client(msg:str)

SpectraPTController.selectPreset[source]

SpectraPTController.selectPreset(lumtarget:float)

SpectraPTController.turnOnLamp[source]

SpectraPTController.turnOnLamp()

SpectraPTController.turnOffLamp[source]

SpectraPTController.turnOffLamp()

{% endraw %}